Nested Inequalities Among Divergence Measures
Author
Abstract
In this paper we consider an inequality involving 11 divergence measures. Three of them are logarithmic: the Jeffreys-Kullback-Leibler [4] [5] J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [7] arithmetic-geometric mean divergence. Another three are non-logarithmic: the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination. Three more arise from mean divergences; Pranesh and Johnson [6] and Jain and Srivastava [3] studied divergence measures of this kind. We consider the measures arising from the differences of a single inequality involving all 11 divergence measures, arranged as a sequence. Based on these differences we obtain many inequalities, which are presented in nested or sequential form. Some reverse inequalities and equivalent versions are also studied.
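To illustrate the kind of nesting involved, one well-known chain among six of these symmetric measures, due to Taneja, is (1/4)Δ ≤ I ≤ h ≤ (1/8)J ≤ T ≤ (1/16)Ψ, where Δ is the triangular discrimination, I the Jensen-Shannon divergence, h the Hellinger discrimination, J the J-divergence, T the arithmetic-geometric mean divergence, and Ψ the symmetric χ²-divergence. The sketch below is a minimal numerical check of that chain only; the function names and test distributions are illustrative, and the paper's full 11-measure inequality is not reproduced here.

```python
import math

def j_div(p, q):
    # Jeffreys-Kullback-Leibler J-divergence: sum (p_i - q_i) ln(p_i / q_i)
    return sum((a - b) * math.log(a / b) for a, b in zip(p, q))

def js_div(p, q):
    # Jensen-Shannon divergence I: mean of KL divergences to the midpoint
    return 0.5 * sum(a * math.log(2 * a / (a + b)) + b * math.log(2 * b / (a + b))
                     for a, b in zip(p, q))

def ag_div(p, q):
    # Arithmetic-geometric mean divergence T
    return sum(((a + b) / 2) * math.log((a + b) / (2 * math.sqrt(a * b)))
               for a, b in zip(p, q))

def hellinger(p, q):
    # Hellinger discrimination h: (1/2) sum (sqrt(p_i) - sqrt(q_i))^2
    return 0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))

def sym_chi2(p, q):
    # Symmetric chi-square divergence Psi: sum (p_i - q_i)^2 (p_i + q_i) / (p_i q_i)
    return sum((a - b) ** 2 * (a + b) / (a * b) for a, b in zip(p, q))

def triangular(p, q):
    # Triangular discrimination Delta: sum (p_i - q_i)^2 / (p_i + q_i)
    return sum((a - b) ** 2 / (a + b) for a, b in zip(p, q))

# Two arbitrary probability distributions with full support.
p = (0.5, 0.3, 0.2)
q = (0.2, 0.5, 0.3)

# The nested chain (1/4)Delta <= I <= h <= (1/8)J <= T <= (1/16)Psi.
chain = [triangular(p, q) / 4, js_div(p, q), hellinger(p, q),
         j_div(p, q) / 8, ag_div(p, q), sym_chi2(p, q) / 16]
assert all(x <= y for x, y in zip(chain, chain[1:]))
print([round(v, 4) for v in chain])
```

All six measures vanish when p = q and increase together as the distributions separate, which is what makes a single nested chain of this kind possible.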
Similar Resources
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we will verify measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -Divergence, … and so on. Properties and results related to distance between probability d...
Refinement Inequalities among Symmetric Divergence Measures
There are three classical divergence measures in the literature on information theory and statistics, namely, Jeffreys-Kullback-Leibler's J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence and Taneja's arithmetic-geometric mean divergence. These bear an interesting relationship among each other and are based on logarithmic expressions. The divergence measures like Hellinger discriminatio...
Sequence of inequalities among fuzzy mean difference divergence measures and their applications
This paper presents a sequence of fuzzy mean difference divergence measures. The validity of these fuzzy mean difference divergence measures is proved axiomatically. In addition, it introduces a sequence of inequalities among some of these fuzzy mean difference divergence measures. The applications of proposed fuzzy mean difference divergence measures in the context of pattern recognition have ...
A Sequence of Inequalities among Difference of Symmetric Divergence Measures
In this paper we have considered two one-parametric generalizations. These two generalizations include in particular the well-known measures such as the J-divergence, the Jensen-Shannon divergence and the arithmetic-geometric mean divergence. These three measures have logarithmic expressions. As particular cases we also have measures such as the Hellinger discrimination, the symmetric χ²-divergence, and the trian...
Some Inequalities Among New Divergence Measures
Abstract There are three classical divergence measures in the literature on information theory and statistics. These are namely, the Jeffreys-Kullback-Leibler J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence and the Taneja [8] arithmetic-geometric mean divergence. These three measures bear an interesting relationship among each other and are based on logarithmic expressions. The divergence m...
Journal: CoRR
Volume: abs/1111.6372
Pages: -
Publication date: 2011